A Closed Form Solution to Multi-View Low-Rank Regression
Authors
Abstract
Real-life data often include information from different channels. For example, in computer vision, an image can be described using different features, such as pixel intensity, color, HOG, GIST, and SIFT. These different aspects of the same objects are often called multi-view (or multi-modal) data. The low-rank regression model has proved to be an effective learning mechanism that exploits the low-rank structure of real-life data, but previous low-rank regression models work only on single-view data. In this paper, we propose a multi-view low-rank regression model by imposing low-rank constraints on a multi-view regression model. Most importantly, we provide a closed-form solution to the multi-view low-rank regression model. Extensive experiments on 4 multi-view datasets show that the multi-view low-rank regression model outperforms single-view regression models and reveal that the multi-view low-rank structure is very helpful.

Introduction

In many tasks, a single object can be described using information from different channels (or views). For example, a 3-D object can be described using pictures taken from different angles; a website can be described using the words it contains and the hyperlinks it contains; an image can be described using different features, such as SIFT and HOG; in daily life, a person can be characterized by age, height, weight, and so on. These data all come from different aspects and channels. Multi-view problems aim to improve existing single-view models by learning a model that utilizes data collected from multiple channels (Rüping and Scheffer 2005; de Sa 2005; Zhou and Burges 2007). The low-rank regression model has proved to be an effective learning mechanism that exploits the low-rank structure of real-life data (Xiang et al. 2012; Evgeniou and Pontil 2007; Cai et al. 2013). Existing regression models only work on single-view data.
To be specific, linear regression finds a linear model with respect to the single-view feature data that fits the target class data (Seber and Lee 2012). Let the matrix B ∈ ℝ^{p×c} be the parameter of the linear model. Linear regression solves the problem min_B ‖Y − XᵀB‖_F², where X = [x_1, x_2, ..., x_n] ∈ ℝ^{p×n} is the single-view feature data matrix and Y ∈ ℝ^{n×c} is the target class indicator matrix. Ridge regression can achieve better results by adding a Frobenius-norm-based regularization to the linear regression objective (Hoerl and Kennard 1970; Marquaridt 1970). Ridge regression solves the problem min_B ‖Y − XᵀB‖_F² + λ‖B‖_F², where λ is the regularization weight parameter. Cai (Cai et al. 2013) showed that when B is low-rank, regression is equivalent to linear discriminant analysis based regressions. However, all these works only apply to single-view problems.

In this paper, we propose a multi-view low-rank regression model by imposing low-rank constraints on the regression model. This model can be solved directly with a closed-form solution. In linear regression, the low-rank parameter matrix B_ν depends on the view ν. Through theoretical analysis, we show that the multi-view low-rank regression model is equivalent to performing regression in the subspace of each view. In other words, letting B_ν = A_ν B, it is equivalent to finding the shared regression parameter matrix B under the subspace transformation A_ν with respect to view ν. Extensive experiments performed on 4 multi-view datasets show that the proposed model outperforms single-view regression models and reveal that the low-rank structure can improve the classification results of a full-rank model.

Copyright © 2015, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

Notations. In this paper, matrices are written in uppercase letters, such as X, Y. Vectors are written in bold lowercase letters, such as x, y. Tr(X) denotes the trace of the matrix X.
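The ridge regression problem above admits the standard closed-form solution B = (XXᵀ + λI)⁻¹XY under the column-samples convention X ∈ ℝ^{p×n}. The following is a minimal sketch with synthetic data (the sizes p, n, c and the value of λ are hypothetical, not taken from the paper):

```python
import numpy as np

# Hypothetical problem sizes; the paper's convention is
# X in R^{p x n} (columns are samples) and Y in R^{n x c}.
rng = np.random.default_rng(0)
p, n, c = 5, 20, 3
X = rng.standard_normal((p, n))
Y = rng.standard_normal((n, c))
lam = 0.1  # regularization weight lambda (illustrative value)

# Closed-form ridge solution minimizing ||Y - X^T B||_F^2 + lam ||B||_F^2:
#   B = (X X^T + lam I_p)^{-1} X Y
B = np.linalg.solve(X @ X.T + lam * np.eye(p), X @ Y)

# Fitted residual of the regularized least-squares problem
residual = np.linalg.norm(Y - X.T @ B)
```

Solving the p×p linear system with `np.linalg.solve` is preferred to forming the explicit inverse; the λI term keeps the system well conditioned even when p > n.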
Multi-view Low-Rank Regression

Assume that there are v views and c classes, p_ν is the dimension of view ν, n_j is the sample size of the j-th class, and n is the total sample size. Let X_ν = [x_1^ν, ..., x_n^ν] ∈ ℝ^{p_ν×n} be the data matrix of view ν, ν = 1, 2, ..., v, and let Y = [y_1, ..., y_c] ∈ ℝ^{n×c} be the normalized class indicator matrix, i.e., Y_ij = 1/√(n_j) if the i-th data point belongs to the j-th class and Y_ij = 0 otherwise. We try to minimize the residual of the low-rank regression model in each class and in each view. The loss function of the multi-

Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence
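The normalized class indicator matrix Y defined above can be built directly from a label vector. A minimal sketch (the label vector is a hypothetical example, not data from the paper):

```python
import numpy as np

# Hypothetical labels for n = 6 samples over c = 3 classes.
labels = np.array([0, 0, 1, 2, 2, 2])
n, c = len(labels), int(labels.max()) + 1

# n_j: sample size of the j-th class.
counts = np.bincount(labels, minlength=c)

# Y_ij = 1/sqrt(n_j) if sample i belongs to class j, else 0.
Y = np.zeros((n, c))
Y[np.arange(n), labels] = 1.0 / np.sqrt(counts[labels])
```

With this 1/√(n_j) scaling each column of Y has unit Euclidean norm, so YᵀY = I_c, which is what makes this indicator matrix convenient in discriminant-analysis-style derivations.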